1,595 research outputs found

    Phase transition in PCA with missing data: Reduced signal-to-noise ratio, not sample size!

    How does missing data affect our ability to learn signal structures? It has been shown that learning signal structure in terms of principal components depends on the ratio of sample size to dimensionality, and that a critical number of observations is needed before learning starts (Biehl and Mietzner, 1993). Here we generalize this analysis to include missing data. Probabilistic principal component analysis is regularly used for estimating signal structures in datasets with missing data. Our analytic result suggests that the effect of missing data is to effectively reduce the signal-to-noise ratio rather than, as generally believed, to reduce the sample size. The theory predicts a phase transition in the learning curves, and this is indeed found both in simulation data and in real datasets. Comment: Accepted to ICML 2019. This version is the submitted paper.
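    The phase-transition claim can be illustrated with a small, self-contained simulation. The sketch below is not the authors' code: probabilistic PCA is replaced by simple iterative rank-one imputation, and the function name simulate_overlap together with all parameter settings are assumptions made for the illustration. It generates spiked data, masks a fraction of the entries, and reports how well the leading component is recovered as the sample size grows.

    # Illustrative simulation (assumed, not the paper's code): probabilistic PCA is
    # replaced by simple iterative rank-one imputation, which is enough to show the
    # abrupt onset of learning described in the abstract.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_overlap(n_samples, dim=200, snr=2.0, p_missing=0.3, n_iter=50):
        u = rng.standard_normal(dim)
        u /= np.linalg.norm(u)                     # true signal direction
        z = rng.standard_normal(n_samples)         # latent scores
        X = np.sqrt(snr) * np.outer(z, u) + rng.standard_normal((n_samples, dim))
        mask = rng.random(X.shape) < p_missing     # True where an entry is missing
        X_obs = np.where(mask, np.nan, X)

        # Iteratively refill the missing entries with the current rank-one fit.
        X_filled = np.where(mask, 0.0, X_obs)
        for _ in range(n_iter):
            U, s, Vt = np.linalg.svd(X_filled, full_matrices=False)
            rank1 = s[0] * np.outer(U[:, 0], Vt[0])
            X_filled = np.where(mask, rank1, X_obs)

        u_hat = Vt[0]                              # estimated leading component
        return abs(u_hat @ u)                      # overlap in [0, 1]

    for n in (50, 100, 200, 400, 800, 1600):
        print(n, round(simulate_overlap(n), 3))

    With these settings the printed overlap should stay low for the smallest sample sizes and rise sharply once enough observations are available; increasing p_missing should shift that onset to larger sample sizes, consistent with a reduced effective signal-to-noise ratio.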

    Stilhed før fremtiden (Silence before the future)


    Boltzmann learning of parameters in cellular neural networks


    Adaptive Regularization in Neural Network Modeling

    In this paper we address the important problem of optimizing regularization parameters in neural network modeling. The suggested optimization scheme is an extended version of the recently presented algorithm [24]. The idea is to minimize an empirical estimate, such as the cross-validation estimate, of the generalization error with respect to the regularization parameters. This is done by employing a simple iterative gradient descent scheme with virtually no additional programming overhead compared to standard training. Experiments with feed-forward neural network models for time series prediction and classification tasks showed the viability and robustness of the algorithm. Moreover, we provided some simple theoretical examples in order to illustrate the potential and limitations of the proposed regularization framework. 1 Introduction: Neural networks are flexible tools for time series processing and pattern recognition. By increasing the number of hidden neurons in a 2-layer architecture…
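    A minimal sketch of the underlying idea, under assumed simplifications: a ridge-regression stand-in replaces the neural network, and a single weight-decay parameter is adjusted by gradient descent on the validation error, i.e. an empirical estimate of the generalization error is minimized with respect to the regularization parameter. The helper tune_lambda and the toy data are hypothetical, not from the paper.

    # Minimal sketch of the idea (assumed; a ridge-regression stand-in, not the
    # paper's neural network algorithm): a single weight-decay parameter is tuned
    # by gradient descent on the validation error, i.e. on an empirical estimate
    # of the generalization error. Names like `tune_lambda` are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)

    def tune_lambda(X_tr, y_tr, X_val, y_val, log_lam=0.0, lr=0.1, steps=100):
        """Gradient descent on the validation error with respect to log(lambda)."""
        d = X_tr.shape[1]
        for _ in range(steps):
            lam = np.exp(log_lam)
            A = X_tr.T @ X_tr + lam * np.eye(d)
            w = np.linalg.solve(A, X_tr.T @ y_tr)   # ridge weights for current lambda
            resid = X_val @ w - y_val
            dE_dw = X_val.T @ resid / len(y_val)    # gradient of (half) the validation MSE wrt w
            dw_dlam = -np.linalg.solve(A, w)        # implicit derivative of w wrt lambda
            dE_dloglam = lam * (dE_dw @ dw_dlam)    # chain rule through the log-parametrization
            log_lam -= lr * dE_dloglam
        return np.exp(log_lam)

    # Toy data: linear signal plus noise, split into training and validation sets.
    X = rng.standard_normal((120, 20))
    w_true = rng.standard_normal(20)
    y = X @ w_true + 0.5 * rng.standard_normal(120)
    print("selected weight decay:", round(tune_lambda(X[:80], y[:80], X[80:], y[80:]), 4))

    Optimizing log(lambda) rather than lambda itself keeps the regularization parameter positive and makes the gradient step scale-free, a common choice in schemes of this kind.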

    On an emotional node: modeling sentiment in graphs of action verbs


    Cognitive semantic networks: emotional verbs throw a tantrum but don't bite
